Recurrent Autoassociative Networks and Holistic Computations

Author

  • Ivelin Stoianov
Abstract

This paper presents an experimental study of holistic computations over distributed representations (DRs) of sequences developed by Recurrent Autoassociative Networks (RANs). Three groups of holistic operators are studied: extracting a symbol at a fixed position, extracting a symbol at a variable position, and reversing strings. The success of these operators and their very good generalization pave the way for holistic linguistic transformations, and also bring a better understanding of the structure of the DRs that RANs develop.
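The abstract does not specify the RAN encoder itself, but the idea of a holistic operator — a single map applied to a whole compressed code, never to its individual symbols — can be illustrated with a toy stand-in. In the sketch below, the distributed representations are sums of position-bound random symbol vectors (an assumed encoding, not the paper's), and string reversal is learned as one linear map over whole codes:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

D, L = 64, 4                        # code dimensionality, fixed string length
alphabet = "abcde"
E = {c: rng.standard_normal(D) for c in alphabet}                 # symbol codes
P = [rng.standard_normal((D, D)) / np.sqrt(D) for _ in range(L)]  # position maps

def encode(s):
    """Toy distributed representation: sum of position-bound symbol codes."""
    return sum(P[i] @ E[c] for i, c in enumerate(s))

# Training pairs: code of a string mapped to code of its reversal.
train = {"".join(rng.choice(list(alphabet), L)) for _ in range(300)}
X = np.array([encode(s) for s in train])
Y = np.array([encode(s[::-1]) for s in train])

# The holistic reversal operator: one linear map M acting on whole codes.
M, *_ = np.linalg.lstsq(X, Y, rcond=None)

# It transfers to a string that never appeared in the training set.
held_out = next("".join(t) for t in itertools.product(alphabet, repeat=L)
                if "".join(t) not in train)
error = np.linalg.norm(encode(held_out) @ M - encode(held_out[::-1]))
```

Because the encoding is linear in the position bindings, an exact reversal operator exists, and least squares recovers it from examples — a crude analogue of the generalization the paper reports for its learned holistic operators.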


Similar Articles

An Autoassociative Neural Network Model of Paired-Associate Learning

Hebbian heteroassociative learning is inherently asymmetric. Storing a forward association, from item A to item B, enables recall of B (given A), but does not permit recall of A (given B). Recurrent networks can solve this problem by associating A to B and B back to A. In these recurrent networks, the forward and backward associations can be differentially weighted to account for asymmetries in...
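The symmetric scheme the abstract describes can be sketched numerically (with made-up sizes and random bipolar patterns, not the paper's model): storing both A→B and B→A as Hebbian outer products in a single weight matrix makes one-step recall work in either direction.

```python
import numpy as np

rng = np.random.default_rng(1)
N, pairs = 512, 5                        # units per item, stored A-B pairs

# Random bipolar item vectors for the A and B members of each pair.
A = rng.choice([-1.0, 1.0], size=(pairs, N))
B = rng.choice([-1.0, 1.0], size=(pairs, N))

# Hebbian storage of both directions: W = sum_k (B_k A_k^T + A_k B_k^T).
W = B.T @ A + A.T @ B

def recall(cue):
    """One retrieval step: threshold the weighted input."""
    return np.sign(W @ cue)

# Forward recall (A -> B) and backward recall (B -> A) both succeed.
forward_ok  = all(np.array_equal(recall(A[k]), B[k]) for k in range(pairs))
backward_ok = all(np.array_equal(recall(B[k]), A[k]) for k in range(pairs))
```

Weighting the two outer-product terms differently would reintroduce the recall asymmetry the abstract mentions.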


Holographic Recurrent Networks

Holographic Recurrent Networks (HRNs) are recurrent networks which incorporate associative memory techniques for storing sequential structure. HRNs can be easily and quickly trained using gradient descent techniques to generate sequences of discrete outputs and trajectories through continuous space. The performance of HRNs is found to be superior to that of ordinary recurrent networks on these ...
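The "associative memory techniques" referred to here are in the holographic reduced representation family. A common concrete instance (sketched with assumed dimensions; this is the storage primitive, not the HRN architecture itself) binds items to position vectors with circular convolution and unbinds them with circular correlation:

```python
import numpy as np

rng = np.random.default_rng(2)
D = 1024                                   # vector dimensionality (assumed)

def cconv(a, b):
    """Circular convolution: the binding operation, done via FFT."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def ccorr(a, b):
    """Circular correlation: approximate unbinding (inverse of cconv)."""
    return np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)))

items = rng.normal(0, 1 / np.sqrt(D), size=(8, D))    # item codebook
pos   = rng.normal(0, 1 / np.sqrt(D), size=(4, D))    # position vectors

# Store a sequence of item indices as one superposed trace.
seq = [3, 0, 6, 2]
trace = sum(cconv(pos[i], items[k]) for i, k in enumerate(seq))

# Decoding: unbind each position, then clean up against the codebook.
decoded = [int(np.argmax(items @ ccorr(pos[i], trace))) for i in range(4)]
```

The retrieved vectors are noisy, which is why the cleanup step (nearest neighbor in the codebook) is part of the scheme.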


Network Capacity for Latent Attractor Computation

Attractor networks have been one of the most successful paradigms in neural computation and have been used as models of computation in the nervous system. Many experimentally observed phenomena, such as coherent population codes, contextual representations, and replay of learned neural activity patterns, are explained well by attractor dynamics. Recently we proposed a paradigm called latent attractor...
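As context for the attractor paradigm, a minimal Hopfield-style network (illustrative sizes; this is the classical setup, not the latent-attractor model) stores patterns as fixed points and pulls corrupted cues back to them:

```python
import numpy as np

rng = np.random.default_rng(3)
N, P = 128, 3                                # neurons, stored patterns

patterns = rng.choice([-1.0, 1.0], size=(P, N))
W = patterns.T @ patterns / N                # Hebbian weights
np.fill_diagonal(W, 0.0)                     # no self-connections

def settle(state, steps=10):
    """Synchronous updates until the state stops changing."""
    for _ in range(steps):
        new = np.where(W @ state >= 0, 1.0, -1.0)
        if np.array_equal(new, state):
            break
        state = new
    return state

# Corrupt a stored pattern by flipping 15 units, then let the net settle.
cue = patterns[0].copy()
cue[rng.choice(N, size=15, replace=False)] *= -1
restored = settle(cue)
```

Each stored pattern sits in its own basin of attraction; the capacity question the abstract raises is how many such basins a network of a given size can sustain.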


High Order Neural Networks for Efficient Associative Memory Design

We propose learning rules for recurrent neural networks with high-order interactions between some or all neurons. The designed networks exhibit the desired associative memory function: perfect storage and retrieval of pieces of information and/or sequences of information of any complexity.
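A hedged sketch of the general idea (third-order Hebbian storage with made-up sizes; the paper's specific learning rules are not given in the abstract): letting each unit's input depend on products of pairs of other units raises storage capacity well above the pairwise case.

```python
import numpy as np

rng = np.random.default_rng(4)
N, P = 32, 3                                  # units, stored patterns

patterns = rng.choice([-1.0, 1.0], size=(P, N))

# Third-order Hebbian tensor: T[i, j, k] = sum_p x_i x_j x_k.
T = np.einsum('pi,pj,pk->ijk', patterns, patterns, patterns)

def update(s):
    """One step: each unit sums second-order products of the others."""
    h = np.einsum('ijk,j,k->i', T, s, s)
    return np.where(h >= 0, 1.0, -1.0)

# Recall pattern 0 from a cue with 4 flipped units.
cue = patterns[0].copy()
cue[rng.choice(N, size=4, replace=False)] *= -1
recalled = update(update(cue))
```

The stored pattern dominates because its squared overlap with the cue, roughly (N - 2·flips)², far exceeds the crosstalk from the other patterns.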




Publication date: 2000